
    Mixtures of Gaussian distributions under linear dimensionality reduction

    High-dimensional spaces pose a serious challenge to the learning process. The combination of a limited number of samples and high dimensionality places many problems under the "curse of dimensionality", which severely restricts the practical application of density estimation. Many techniques have been proposed in the past to discover embedded, locally-linear manifolds of lower dimensionality, including the mixture of Principal Component Analyzers, the mixture of Probabilistic Principal Component Analyzers and the mixture of Factor Analyzers. In this paper, we present a mixture model for reducing dimensionality based on a linear transformation which is not restricted to be orthogonal. Two methods are proposed for learning the transformations and mixture parameters: the first is based on an iterative maximum-likelihood approach and the second on random transformations and fixed (non-iterative) probability functions. For experimental validation, we have used the proposed model for maximum-likelihood classification of five "hard" data sets, including data sets from the UCI repository and the authors' own. Moreover, we compared the classification performance of the proposed method with that of other popular classifiers, including the mixture of Probabilistic Principal Component Analyzers and the Gaussian mixture model. In all cases but one, the accuracy achieved by the proposed method was the highest, with increases over the runner-up ranging from 0.2% to 5.2%.
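
    As a rough illustration of the second (random-transformation) learning method, the sketch below projects the data through a single shared, generally non-orthogonal random matrix and fits one Gaussian mixture per class in the reduced space, classifying by maximum class-conditional likelihood. It is a minimal stand-in, not the authors' implementation: the paper's model is richer, and the dimensions, mixture sizes and variable names here are illustrative placeholders.

```python
# Minimal sketch (not the authors' implementation): reduce dimensionality with a
# random, non-orthogonal linear transformation and fit a Gaussian mixture per class,
# then classify by maximum class-conditional likelihood.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def fit_class_models(X, y, reduced_dim=5, n_components=3):
    d = X.shape[1]
    # Random (generally non-orthogonal) projection, shared by all classes for simplicity.
    W = rng.normal(size=(d, reduced_dim))
    models = {}
    for c in np.unique(y):
        gmm = GaussianMixture(n_components=n_components, covariance_type='full',
                              random_state=0)
        gmm.fit(X[y == c] @ W)          # density estimate in the reduced space
        models[c] = gmm
    return W, models

def predict(X, W, models):
    Z = X @ W
    classes = np.array(sorted(models))
    log_liks = np.column_stack([models[c].score_samples(Z) for c in classes])
    return classes[np.argmax(log_liks, axis=1)]
```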

    Robust dimensionality reduction for human action recognition

    Human action recognition can be approached by combining an action-discriminative feature set with a classifier. However, the dimensionality of typical feature sets, combined with that of the time dimension, often leads to a curse-of-dimensionality situation. Moreover, the measurement of the feature set is subject to sometimes severe errors. This paper presents an approach to human action recognition based on robust dimensionality reduction. The observation probabilities of hidden Markov models (HMMs) are modelled by mixtures of probabilistic principal component analyzers and mixtures of t-distribution sub-spaces, and compared with conventional Gaussian mixture models. Experimental results on two datasets show that dimensionality reduction helps improve the classification accuracy and that the heavier-tailed t-distribution can help reduce the impact of outliers generated by segmentation errors. © 2010 Crown Copyright
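
    The conventional baseline mentioned above (one HMM per action class with Gaussian-mixture observation densities, classification by maximum log-likelihood) can be sketched with hmmlearn as below. The paper's MPPCA and t-distribution sub-space emissions are not available there and are not shown; the class names, feature dimensions and hyper-parameters are illustrative.

```python
# Sketch of the conventional GMM-emission HMM baseline, not the paper's MPPCA/t models.
import numpy as np
from hmmlearn.hmm import GMMHMM

def train_action_models(sequences_by_class, n_states=4, n_mix=3):
    models = {}
    for action, seqs in sequences_by_class.items():   # seqs: list of (T_i, d) arrays
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        hmm = GMMHMM(n_components=n_states, n_mix=n_mix,
                     covariance_type='diag', n_iter=50, random_state=0)
        hmm.fit(X, lengths)
        models[action] = hmm
    return models

def classify(sequence, models):
    # Assign the sequence to the action whose HMM gives the highest log-likelihood.
    scores = {action: model.score(sequence) for action, model in models.items()}
    return max(scores, key=scores.get)
```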

    Compressive Sensing of time series for human action recognition

    Compressive Sensing (CS) is an emerging signal processing technique in which a sparse signal is reconstructed from a small set of random projections. In the recent literature, CS techniques have demonstrated promising results for signal compression and reconstruction [9, 8, 1]. However, their potential as dimensionality reduction techniques for time series has not been significantly explored to date. To this end, this work investigates the suitability of compressive-sensed time series in an application of human action recognition. In the paper, results from several experiments are presented: (1) in the first set of experiments, the time series are transformed into the CS domain and fed into a hidden Markov model (HMM) for action recognition; (2) in the second set, the time series are explicitly reconstructed after CS compression and then used for recognition; (3) in the third set, the time series are compressed with a hybrid CS-Haar basis prior to input into the HMM; (4) in the fourth set, the time series are reconstructed from the hybrid CS-Haar basis and used for recognition. We further compare these approaches with alternative techniques such as sub-sampling and filtering. Results from our experiments show that the application of CS does not degrade the recognition accuracy; rather, it often increases it. This shows that CS can provide a desirable form of dimensionality reduction in pattern recognition over time series. © 2010 Crown Copyright
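
    The core CS step, compressing a time series with a small set of random projections, is easy to sketch. Below, a toy 1-D series is compressed as y = Phi x with a random Gaussian measurement matrix and optionally reconstructed under a sparsity assumption using orthogonal matching pursuit in a DCT basis (a stand-in for the paper's Haar basis). The signal, lengths and sparsity level are illustrative, and this is not the paper's exact pipeline.

```python
# Minimal CS sketch: random projection for compression, optional sparse reconstruction.
import numpy as np
from scipy.fftpack import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

n, m = 128, 32                                   # original length, number of projections
t = np.linspace(0, 1, n)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)   # toy time series

Phi = rng.normal(size=(m, n)) / np.sqrt(m)       # random Gaussian measurement matrix
y = Phi @ x                                      # compressed representation used for recognition

# Optional reconstruction: assume x = D @ s with sparse s, where D is the inverse-DCT
# (synthesis) basis, and solve y ~= (Phi @ D) s by orthogonal matching pursuit.
D = idct(np.eye(n), axis=0, norm='ortho')        # columns are DCT synthesis basis vectors
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=8, fit_intercept=False).fit(Phi @ D, y)
x_hat = D @ omp.coef_                            # approximate reconstruction of x
```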

    MLiT: Mixtures of Gaussians under linear transformations

    The curse of dimensionality hinders the effectiveness of density estimation in high-dimensional spaces. Many techniques have been proposed in the past to discover embedded, locally linear manifolds of lower dimensionality, including the mixture of principal component analyzers, the mixture of probabilistic principal component analyzers and the mixture of factor analyzers. In this paper, we propose a novel mixture model for reducing dimensionality based on a linear transformation which is restricted neither to be orthogonal nor to be aligned with the principal directions. For experimental validation, we have used the proposed model for classification of five "hard" data sets and compared its accuracy with that of other popular classifiers. The proposed method outperformed the mixture of probabilistic principal component analyzers on four of the five data sets, with improvements ranging from 0.5% to 3.2%. Moreover, on all data sets, its accuracy exceeded that of the Gaussian mixture model, with improvements ranging from 0.2% to 3.4%. © 2011 Springer-Verlag London Limited
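
    In both of the above abstracts, classification ultimately rests on class-conditional mixture likelihoods evaluated after a learned linear map. With our own notation (a class-specific, not necessarily orthogonal transformation A_c and K mixture components per class), the decision rule has the general form below; the exact parameterization and normalization used by MLiT should be taken from the paper itself.

```latex
% General form only; notation is ours, not the paper's.
\[
  p(\mathbf{x} \mid c) \;=\; \sum_{k=1}^{K} \pi_{c,k}\,
      \mathcal{N}\!\bigl(A_c \mathbf{x} \,;\, \boldsymbol{\mu}_{c,k},\, \Sigma_{c,k}\bigr),
  \qquad
  \hat{c} \;=\; \arg\max_{c}\; p(\mathbf{x} \mid c)\, p(c).
\]
```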

    Role of pulmonary intravascular macrophages in endotoxin-induced lung inflammation and mortality in a rat model

    Background: Bile-duct ligated (BDL) rats recruit pulmonary intravascular macrophages (PIMs) and are highly susceptible to endotoxin-induced mortality. The mechanisms of this enhanced susceptibility and mortality in BDL rats, which are used as a model of hepato-pulmonary syndrome, remain unknown. We tested a hypothesis that recruited PIMs promote endotoxin-induced mortality in a rat model. Methods: Rats were subjected to BDL to induce PIM recruitment followed by treatment with gadolinium chloride (GC) to deplete PIMs. Normal and BDL rats were treated intravenously with E. coli lipopolysaccharide (LPS) with or without GC pre-treatment followed by collection and analyses of lungs for histopathology, electron microscopy and cytokine quantification. Results: BDL rats recruited PIMs without any change in the expression of IL-1β, TNF-α and IL-10. GC caused reduction in PIMs at 48 hours post-treatment (P < 0.05). BDL rats treated intravenously with E. coli LPS died within 3 hours of the challenge while the normal LPS-treated rats were euthanized at 6 hours after the LPS treatment. GC treatment of rats 6 hours or 48 hours before LPS challenge resulted in 80% (1/5) and 100% (0/5) survival, respectively, at 6 hours post-LPS treatment. Lungs from BDL+LPS rats showed large areas of perivascular hemorrhages compared to those pre-treated with GC. Concentrations of IL-1β, TNF-α and IL-10 were increased in lungs of BDL+LPS rats compared to BDL rats treated with GC 48 hours but not 6 hours before LPS (P < 0.05). Conclusion: We conclude that PIMs increase susceptibility for LPS-induced lung injury and mortality in this model, which is blocked by a reduction in their numbers or their inactivation.

    Does time of surgery influence the rate of false-negative appendectomies? A retrospective observational study of 274 patients

    Background: Multiple disciplines have described an "after-hours effect" relating to worsened mortality and morbidity outside regular working hours. This retrospective observational study aimed to evaluate whether diagnostic accuracy for a common surgical condition worsened after regular hours. Methods: Electronic operative records for all non-infant patients (age > 4 years) operated on at a single centre for presumed acute appendicitis were retrospectively reviewed over a 56-month period (06/17/2012–02/01/2017). The primary outcome measure of unknown diagnosis was compared between procedures performed in regular hours (08:00–17:00) and off hours (17:01–07:59). Pre-clinical biochemistry and pre-morbid status were recorded to determine case heterogeneity between the two groups, along with the secondary outcomes of length of stay and complication rate. Results: Of 289 procedures, 274 cases were deemed eligible for inclusion. Of the 133 performed in regular hours, 79% were appendicitis, compared to 74% of the 141 procedures performed off hours. The percentage of patients with an unknown diagnosis was 6% in regular hours compared to 15% off hours (RR 2.48; 95% CI 1.14–5.39). This was accompanied by an increased proportion of registrars (residents in training) leading procedures off hours (37% compared to 24% in regular hours). Pre-morbid status, biochemistry, length of stay and post-operative complication rate showed no significant differences. Conclusions: This retrospective study suggests that the rate of unknown diagnoses for acute appendicitis increases overnight, potentially reflecting an increased number of unnecessary procedures being performed off hours due to poorer diagnostic accuracy. Reduced staffing levels, availability of diagnostic modalities and changes to workforce training may explain this, but further prospective work is required. Potential solutions may include protocolizing the management of common acute surgical conditions and making more use of non-resident on-call senior colleagues.
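
    The reported risk ratio can be checked with a short calculation. The counts below are inferred from the stated percentages (6% of 133 is about 8 unknown diagnoses in regular hours, 15% of 141 about 21 off hours) and are therefore an approximation, not figures quoted in the abstract; the standard log-RR formula then gives the 95% confidence interval.

```python
# Reproduce the reported RR and 95% CI from counts inferred from the percentages.
import math

a, n1 = 21, 141    # off hours: unknown diagnoses / procedures (assumed counts)
b, n2 = 8, 133     # regular hours: unknown diagnoses / procedures (assumed counts)

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)        # standard error of log(RR)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")     # approx. RR 2.48, CI 1.14-5.39
```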

    Mortality following Stroke, the Weekend Effect and Related Factors: Record Linkage Study

    Increased mortality following hospitalisation for stroke has been reported from many but not all studies that have investigated a 'weekend effect' for stroke. However, it is not known whether the weekend effect is affected by factors including hospital size, season and patient distance from hospital. The aims were to assess changes over time in mortality following hospitalisation for stroke, and how any increased mortality for admissions on weekends is related to factors including the size of the hospital, seasonal factors and distance from hospital. A population study used person-linked inpatient, mortality and primary care data for stroke from 2004 to 2012. The outcome measures were, firstly, mortality at seven days and, secondly, mortality at 30 days and one year. Overall mortality for 37,888 people hospitalised following stroke was 11.6% at seven days, 21.4% at 30 days and 37.7% at one year. Mortality at seven and 30 days fell significantly, by 1.7% and 3.1% per annum respectively, from 2004 to 2012. When compared with weekdays, mortality at seven days was increased significantly, by 19%, for admissions on weekends, although the admission rate was 21% lower on weekends. Although not significant, there were indications of increased mortality at seven days for weekend admissions during winter months (31%), in community (81%) rather than large hospitals (8%), and for patients resident furthest from hospital (32% for distances of >20 kilometres). The weekend effect was significantly increased (by 39%) for strokes of 'unspecified' subtype. Mortality following stroke has fallen over time. Mortality was increased for admissions at weekends, when compared with normal weekdays, but may be influenced by a higher stroke severity threshold for admission on weekends. Other than for unspecified strokes, we found no significant variation in the weekend effect by hospital size, season or distance from hospital.

    Violence and post-traumatic stress disorder in Sao Paulo and Rio de Janeiro, Brazil: the protocol for an epidemiological and genetic survey

    Background: Violence is a major public health concern and is associated with post-traumatic stress disorder and other psychiatric outcomes. Brazil is one of the most violent countries in the world and has extreme social inequality. Research on the association between violence and mental health may support public health policy and thus reduce the burden of disease attributable to violence. The main objectives of this project were: to study the association between violence and mental disorders in the Brazilian population; to estimate the prevalence rates of exposure to violence, post-traumatic stress disorder, common mental disorder, and hazardous alcohol use and dependence; and to identify contextual and individual factors, including genetic factors, associated with the outcomes. Methods/Design: A one-phase cross-sectional survey was carried out in São Paulo and Rio de Janeiro, Brazil. A multistage probability-proportional-to-size sampling scheme was used to select the participants (3,000 and 1,500 respectively). The cities were stratified according to homicide rates, and in São Paulo the three most violent strata were oversampled. The measurements included exposure to traumatic events, psychiatric diagnoses (CIDI 2.1), contextual factors (homicide rates and social indicators), and individual factors such as demographics, social capital, resilience and help-seeking behaviours. The interviews were carried out between June 2007 and February 2008 by a team of lay interviewers. The statistical analyses will be weight-adjusted to take account of the design effects, and standardization will be used to compare the results between the two centres. Whole-genome association analysis will be performed on 1-million-SNP (single nucleotide polymorphism) arrays, and additional association analyses will be performed on additional phenotypes. The Ethical Committee of the Federal University of São Paulo approved the study, and participants who matched diagnostic criteria have been offered a referral to outpatient clinics at the Federal University of São Paulo and the Federal University of Rio de Janeiro.
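
    A minimal sketch of the weight adjustment mentioned above (inverse-probability weighting to compensate for the oversampling of the most violent strata) is given below; the inclusion probabilities and outcomes are made up for illustration and are not the survey's actual sampling fractions or data.

```python
# Design weights as inverse inclusion probabilities, then a weighted prevalence
# estimate. All numbers are hypothetical.
import numpy as np

incl_prob = np.array([0.002, 0.002, 0.001, 0.001, 0.004])  # per-respondent inclusion probability
has_ptsd = np.array([1, 0, 1, 0, 0])                       # hypothetical binary outcome

weights = 1.0 / incl_prob
weighted_prevalence = np.sum(weights * has_ptsd) / np.sum(weights)
crude_prevalence = has_ptsd.mean()                         # unweighted, for comparison
```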

    Neuro-fuzzy Learning Applied to Improve the Trajectory Reconstruction Problem

    This paper presents the application of a neuro-fuzzy learning approach to classify Air Traffic Control (ATC) trajectory segments from recorded opportunity traffic. The method learns a fuzzy system, using neural-network theory to determine its parameters (fuzzy sets and fuzzy rules) from data samples. The problem is formulated around the Markov-chain probabilities estimated by an Interacting Multiple Model (IMM) tracking filter operating forward and backward over the available data. The performance of this data-driven classification system is compared with a more conventional approach based on transition detection, on simulated and real data from representative situations. The problem formulation for this application enabled an accurate classification of manoeuvring segments and the derivation of rules that explain the relation between input attributes and the motion categories used to describe the recorded data.
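
    As a rough illustration of the kind of transition-detection baseline referred to above, the sketch below labels each time step of a trajectory by thresholding IMM manoeuvre-model probabilities obtained from forward and backward passes. The averaging rule and threshold are illustrative assumptions, not the paper's actual detector, and the neuro-fuzzy classifier itself is not shown.

```python
# Illustrative baseline only: combine forward/backward IMM manoeuvre-model
# probabilities and threshold them to label manoeuvring vs. uniform-motion samples.
import numpy as np

def label_segments(p_manoeuvre_fwd, p_manoeuvre_bwd, threshold=0.5):
    """Each argument is an array with the manoeuvre-model probability per time step."""
    p = 0.5 * (np.asarray(p_manoeuvre_fwd) + np.asarray(p_manoeuvre_bwd))
    return np.where(p > threshold, "manoeuvre", "uniform")

# Example with made-up probabilities:
labels = label_segments([0.1, 0.2, 0.8, 0.9], [0.2, 0.3, 0.7, 0.95])
```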